
    Desire and Subjectivity in Twentieth Century American Poetry

    Many studies of American poetry view modernism as an eruption of formal and technical innovations that respond to momentous cultural and political changes, but few attempt to consider the flow and restriction of desire among these changes. This dissertation argues that American modernist poets construct models of desire based on the rejection of sensual objects and a subsequent redirection of desire toward the self and the creative mind. In addition, these models of desire result in a conception of subjects as whole, discrete, and isolated. In the first chapter, I distinguish between Walt Whitman's sensualist model of desire and Emily Dickinson's intellectualist mode that defers satisfaction. I contend that Ezra Pound, Wallace Stevens, T. S. Eliot, and H.D. (Hilda Doolittle) develop from Dickinson's perspective of deferred satisfaction to an outright rejection of physical desire. The manner and implications of this reorganization of desire differ among these poets, as do the poetic techniques they utilize, but underlying these differences is a related refusal to pursue objects of sensual pleasure. Pound withdraws desire from the world by turning objects into static images; desire is then able to flourish in the creative mind. Stevens allows the imagination to remake the world, creating manifold abstractions for subjects who otherwise reject sensuality. The second chapter provides a close reading of Eliot's The Waste Land to show how the presentation of sexual futility leads to a poetic experience of separation as a means of spiritual reformation. The third chapter reads H.D.'s Trilogy as a contemplation of the destruction of World War II and the persistent, unified self that outlasts it. Rather than interacting with this devastated world, H.D. insists that desire must be redirected toward the effort of spiritual redemption. In the fourth chapter, Elizabeth Bishop begins to question the deliberate rejection of the world. She sees a world that reasserts itself and imagines a subject who, though still yearning for unity, must admit an inescapably physical environment. The conclusion considers how postwar American poets continue to dissolve the subject and release desire into the world, emphasizing the present moment rather than a lasting, unified self.

    A Quantum Theory of Temporally Mismatched Homodyne Measurements with Applications to Optical Frequency Comb Metrology

    The fields of precision timekeeping and spectroscopy increasingly rely on optical frequency comb interferometry. However, comb-based measurements are not described by existing quantum theory because they exhibit both large mode mismatch and finite-strength local oscillators. To establish this quantum theory, we derive measurement operators for homodyne detection with arbitrary mode overlap. These operators are a combination of quadrature and intensity-like measurements, which inform a filter that maximizes the quadrature measurement signal-to-noise ratio. Furthermore, these operators establish a foundation for extending frequency-comb interferometry to a wide range of scenarios, including metrology with nonclassical states of light.
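
    As context for the mode-mismatch problem this abstract targets, the standard textbook strong-local-oscillator treatment (not the operators derived in the paper) already shows how imperfect overlap mixes the signal quadrature with vacuum noise. Writing eta for an assumed power overlap between the signal and local-oscillator modes, a simplified detected quadrature can be sketched as:

    \[
      \hat{X}_{\mathrm{det}}(\theta)
        = \sqrt{\eta}\,\hat{X}_{s}(\theta) + \sqrt{1-\eta}\,\hat{X}_{v}(\theta),
      \qquad
      \hat{X}_{s}(\theta)
        = \tfrac{1}{\sqrt{2}}\bigl(\hat{a}_{s} e^{-i\theta} + \hat{a}_{s}^{\dagger} e^{i\theta}\bigr),
    \]

    where \(\hat{X}_{v}\) is the quadrature of an orthogonal vacuum mode, so the quadrature signal-to-noise ratio falls as the overlap shrinks. The operators described in the abstract generalize this idealization to local oscillators of finite strength and arbitrary overlap, combining quadrature and intensity-like terms.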

    LightBox: Full-stack Protected Stateful Middlebox at Lightning Speed

    Running off-site software middleboxes at third-party service providers has been a popular practice. However, routing large volumes of raw traffic, which may carry sensitive information, to a remote site for processing raises severe security concerns. Prior solutions often abstract away important factors pertinent to real-world deployment. In particular, they overlook the significance of metadata protection and stateful processing. Unprotected traffic metadata, such as low-level headers, packet sizes, and packet counts, can be exploited to learn supposedly encrypted application contents. Meanwhile, concurrently tracking the states of hundreds of thousands of flows is often indispensable in production-level middleboxes deployed in real networks. We present LightBox, the first system that can drive off-site middleboxes at near-native speed with stateful processing and the most comprehensive protection to date. Built upon commodity trusted hardware, Intel SGX, LightBox is the product of our systematic investigation of how to overcome the inherent limitations of secure enclaves using domain knowledge and customization. First, we introduce an elegant virtual network interface that allows convenient access to fully protected packets at line rate without leaving the enclave, as if they came from the trusted source network. Second, we provide complete flow-state management for efficient stateful processing, by tailoring a set of data structures and algorithms optimized for the highly constrained enclave space. Extensive evaluations demonstrate that LightBox, with all security benefits, can achieve 10 Gbps packet I/O, and that with case studies on three stateful middleboxes, it can operate at near-native speed. (Accepted at ACM CCS 2019.)
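
    The flow-state problem described above (tracking far more flows than fit in scarce enclave memory) is commonly handled by keeping a small in-enclave cache and spilling sealed entries to an untrusted store on eviction. The Python sketch below only illustrates that general pattern; the names FlowStateCache, _toy_seal, and _toy_unseal are hypothetical stand-ins, not LightBox's actual data structures or SGX code.

    # Schematic sketch (not LightBox code): an LRU cache of per-flow state kept
    # "inside the enclave", spilling sealed entries to an untrusted store on eviction.
    from collections import OrderedDict
    import json

    def _toy_seal(data: bytes) -> bytes:
        # Placeholder for enclave sealing/encryption; NOT a real SGX primitive.
        return data[::-1]

    def _toy_unseal(blob: bytes) -> bytes:
        return blob[::-1]

    class FlowStateCache:
        def __init__(self, capacity: int = 4):
            self.capacity = capacity
            self.cache = OrderedDict()      # flow_id -> state dict (in-enclave memory)
            self.untrusted_store = {}       # flow_id -> sealed blob (outside the enclave)

        def lookup(self, flow_id: str) -> dict:
            if flow_id in self.cache:
                self.cache.move_to_end(flow_id)         # refresh LRU position
                return self.cache[flow_id]
            if flow_id in self.untrusted_store:         # cache miss: unseal and reload
                state = json.loads(_toy_unseal(self.untrusted_store.pop(flow_id)))
            else:
                state = {"pkts": 0, "bytes": 0}         # previously unseen flow
            self._insert(flow_id, state)
            return state

        def _insert(self, flow_id: str, state: dict) -> None:
            self.cache[flow_id] = state
            self.cache.move_to_end(flow_id)
            if len(self.cache) > self.capacity:         # evict LRU entry in sealed form
                victim, vstate = self.cache.popitem(last=False)
                self.untrusted_store[victim] = _toy_seal(json.dumps(vstate).encode())

    # Example: update per-flow counters for a few packets.
    cache = FlowStateCache(capacity=2)
    for fid, size in [("10.0.0.1:443", 1500), ("10.0.0.2:80", 600), ("10.0.0.1:443", 40)]:
        st = cache.lookup(fid)
        st["pkts"] += 1
        st["bytes"] += size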

    Decentralised Edge-Computing and IoT through Distributed Trust

    The emerging Internet of Things needs edge computing; this is an established fact. In turn, edge computing needs infrastructure decentralisation. What is not necessarily established yet is that infrastructure decentralisation needs a distributed model of Internet governance and decentralised trust schemes. We discuss the features of a decentralised IoT and edge-computing ecosystem and list the components that need to be designed, as well as the challenges that need to be addressed.

    Teechain: a secure payment network with asynchronous blockchain access

    Blockchains such as Bitcoin and Ethereum execute payment transactions securely, but their performance is limited by the need for global consensus. Payment networks overcome this limitation through off-chain transactions. Instead of writing to the blockchain for each transaction, they only settle the final payment balances with the underlying blockchain. When executing off-chain transactions in current payment networks, parties must access the blockchain within bounded time to detect misbehaving parties that deviate from the protocol. This opens a window for attacks in which a malicious party can steal funds by deliberately delaying other parties' blockchain access, and it prevents parties from using payment networks when disconnected from the blockchain. We present Teechain, the first layer-two payment network that executes off-chain transactions asynchronously with respect to the underlying blockchain. To prevent parties from misbehaving, Teechain uses treasuries, protected by hardware trusted execution environments (TEEs), to establish off-chain payment channels between parties. Treasuries maintain collateral funds and can exchange transactions efficiently and securely, without interacting with the underlying blockchain. To mitigate treasury failures and to avoid having to trust all TEEs, Teechain replicates the state of treasuries using committee chains, a new variant of chain replication with threshold secret sharing. Teechain achieves at least 33x higher transaction throughput than the state-of-the-art Lightning payment network. A 30-machine Teechain deployment can handle over 1 million Bitcoin transactions per second.
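
    To make the off-chain channel idea concrete, the toy sketch below shows two parties exchanging authenticated balance updates without touching a blockchain and settling only the final balances. The Channel class, its shared HMAC key, and the pay/settle methods are illustrative assumptions only; this is not Teechain's protocol, which additionally relies on enclave-protected treasuries and committee-chain replication.

    # Toy illustration of an off-chain payment channel (NOT the Teechain protocol):
    # parties exchange authenticated balance updates off-chain; only the final
    # balances would ever be settled on the underlying blockchain.
    import hmac, hashlib

    class Channel:
        def __init__(self, deposit_a: int, deposit_b: int, shared_key: bytes):
            self.balances = {"A": deposit_a, "B": deposit_b}
            self.seq = 0                  # monotonically increasing update counter
            self.key = shared_key         # stands in for TEE-held signing material

        def _authenticate(self, payload: str) -> str:
            return hmac.new(self.key, payload.encode(), hashlib.sha256).hexdigest()

        def pay(self, sender: str, receiver: str, amount: int) -> dict:
            if self.balances[sender] < amount:
                raise ValueError("insufficient channel balance")
            self.seq += 1
            self.balances[sender] -= amount
            self.balances[receiver] += amount
            payload = f"{self.seq}:{self.balances['A']}:{self.balances['B']}"
            return {"seq": self.seq, "balances": dict(self.balances),
                    "mac": self._authenticate(payload)}   # authenticated state update

        def settle(self) -> dict:
            # Only this final state would be written to the blockchain.
            return dict(self.balances)

    ch = Channel(deposit_a=100, deposit_b=50, shared_key=b"demo-key")
    ch.pay("A", "B", 30)
    ch.pay("B", "A", 10)
    print(ch.settle())   # {'A': 80, 'B': 70}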

    Genetic association study of QT interval highlights role for calcium signaling pathways in myocardial repolarization.

    The QT interval, an electrocardiographic measure reflecting myocardial repolarization, is a heritable trait. QT prolongation is a risk factor for ventricular arrhythmias and sudden cardiac death (SCD) and could indicate the presence of the potentially lethal Mendelian long-QT syndrome (LQTS). Using a genome-wide association and replication study in up to 100,000 individuals, we identified 35 common variant loci associated with QT interval that collectively explain ∼8-10% of QT-interval variation and highlight the importance of calcium regulation in myocardial repolarization. Rare variant analysis of 6 new QT interval-associated loci in 298 unrelated probands with LQTS identified coding variants not found in controls, but of uncertain causality and therefore requiring validation. Several newly identified loci encode proteins that physically interact with other recognized repolarization proteins. Our integration of common variant association, expression and orthogonal protein-protein interaction screens provides new insights into cardiac electrophysiology and identifies new candidate genes for ventricular arrhythmias, LQTS and SCD.
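
    As a rough illustration of what "variance explained" by common variants means here, the toy simulation below regresses a trait on an additive score over 35 simulated loci and reads off R-squared; all numbers, allele frequencies, and effect sizes are made up for the sketch and are not the study's data or methods.

    # Toy simulation of "variance explained" by common variants (illustrative only).
    import numpy as np

    rng = np.random.default_rng(0)
    n_people, n_loci = 10_000, 35

    mafs = rng.uniform(0.05, 0.5, n_loci)                  # minor allele frequencies
    genotypes = rng.binomial(2, mafs, size=(n_people, n_loci)).astype(float)
    effects = rng.normal(0, 1, n_loci)                     # per-locus effect sizes

    score = genotypes @ effects
    # Scale the noise so the genetic score explains roughly 9% of trait variance.
    target_r2 = 0.09
    noise_sd = np.sqrt(score.var() * (1 - target_r2) / target_r2)
    qt_interval = score + rng.normal(0, noise_sd, n_people)

    r2 = np.corrcoef(score, qt_interval)[0, 1] ** 2        # variance explained
    print(f"simulated variance explained: {r2:.3f}")       # close to 0.09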

    Plasma proteomic signatures of a direct measure of insulin sensitivity in two population cohorts

    Aims/hypothesis: The euglycemic hyperinsulinemic clamp (EIC) is a direct measure of whole-body insulin sensitivity and the reference standard for its assessment, but it is laborious and expensive to perform. We aimed to assess the incremental value of high-throughput plasma proteomic profiling in developing signatures correlating with the M-value derived from the EIC. Methods: We measured 828 proteins in the fasting plasma of 966 participants from the Relationship between Insulin Sensitivity and Cardiovascular disease (RISC) study and 745 participants from the Uppsala Longitudinal Study of Adult Men (ULSAM) using a high-throughput proximity extension assay. We applied the least absolute shrinkage and selection operator (LASSO) approach with clinical variables and protein measures as features. Models were tested within and across cohorts. Our primary model performance metric was the proportion of the M-value variance explained (R2). Results: A standard LASSO model incorporating 53 proteins in addition to routinely available clinical variables increased the M-value R2 from 0.237 (95% confidence interval: 0.178-0.303) to 0.456 (0.372-0.536) in RISC. A similar pattern was observed in ULSAM, in which the M-value R2 increased from 0.443 (0.360-0.530) to 0.632 (0.569-0.698) with the addition of 61 proteins. Models trained in one cohort and tested in the other also demonstrated significant improvements in R2 despite differences in baseline cohort characteristics and clamp methodology: RISC to ULSAM, 0.491 (0.433-0.539) with 51 proteins; ULSAM to RISC, 0.369 (0.331-0.416) with 67 proteins. A randomized LASSO and stability selection algorithm selected only two proteins per cohort (three unique proteins), which improved R2, but to a lesser degree than the standard LASSO models: 0.352 (0.266-0.439) within RISC and 0.495 (0.404-0.585) within ULSAM. Differences in R2 between the randomized and standard LASSO models were notably reduced in the cross-cohort analyses, despite the much smaller number of proteins selected: RISC to ULSAM, 0.444 (0.391-0.497); ULSAM to RISC, 0.348 (0.300-0.396). Models of proteins alone were as effective as models that included both clinical variables and proteins, using either standard or randomized LASSO. The single most consistently selected protein across all analyses and models was IGFBP2. Conclusions/interpretation: A plasma proteomic signature identified through a standard LASSO approach improves the cross-sectional estimation of the M-value over routine clinical variables. However, a small subset of these proteins, identified using the stability selection algorithm, affords much of this improvement, especially when considering cross-cohort analyses. Our approach provides opportunities to improve the identification of insulin-resistant individuals at risk of the adverse health consequences of insulin resistance.
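
    The modelling strategy described, LASSO over clinical variables plus proteins judged by the proportion of M-value variance explained, can be sketched with standard tooling. The snippet below uses simulated data and generic stand-in features, so it only mirrors the general approach, not the study's actual pipeline, variables, or protein panel.

    # Sketch of a LASSO-based signature for a clamp-derived M-value using simulated
    # data (generic stand-in features, not the RISC/ULSAM variables or proteins).
    import numpy as np
    from sklearn.linear_model import LassoCV
    from sklearn.model_selection import train_test_split
    from sklearn.preprocessing import StandardScaler
    from sklearn.pipeline import make_pipeline
    from sklearn.metrics import r2_score

    rng = np.random.default_rng(42)
    n, n_clinical, n_proteins = 900, 5, 828

    X = rng.normal(size=(n, n_clinical + n_proteins))               # clinical vars + proteins
    true_coef = np.zeros(n_clinical + n_proteins)
    true_coef[:n_clinical] = rng.normal(0, 1.0, n_clinical)         # clinical signal
    true_coef[n_clinical:n_clinical + 50] = rng.normal(0, 0.5, 50)  # sparse protein signal
    m_value = X @ true_coef + rng.normal(0, 3.0, n)

    X_train, X_test, y_train, y_test = train_test_split(X, m_value, random_state=0)

    model = make_pipeline(StandardScaler(), LassoCV(cv=5, random_state=0))
    model.fit(X_train, y_train)

    n_selected = np.sum(model.named_steps["lassocv"].coef_ != 0)    # features kept by LASSO
    print(f"features selected: {n_selected}")
    print(f"test R^2: {r2_score(y_test, model.predict(X_test)):.3f}")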